An Incentive Compatible Multi-Armed-Bandit Crowdsourcing Mechanism with Quality Assurance

Authors

  • Shweta Jain
  • Sujit Gujar
  • Satyanath Bhat
  • Onno Zoeter
  • Y. Narahari
Abstract

Consider a requester who wishes to crowdsource a series of identical binary labeling tasks from a pool of workers so as to achieve an assured accuracy for each task, in a cost-optimal way. The workers are heterogeneous with unknown but fixed qualities, and moreover their costs are private. The problem is to select an optimal subset of the workers to work on each task so that the outcome obtained by aggregating their labels guarantees a target accuracy. This problem is challenging because the requester not only has to learn the qualities of the workers but also has to elicit their true costs. We develop a novel multi-armed bandit (MAB) mechanism for solving this problem. We propose a framework, Assured Accuracy Bandit (AAB), which leads to an adaptive, exploration-separated MAB algorithm, Constrained Confidence Bound for the Strategic setting (CCB-S). We derive an upper bound on the number of exploration steps that depends on the target accuracy and the true qualities. We show that our CCB-S algorithm produces an ex-post monotone allocation rule, which can be transformed into an ex-post incentive compatible and ex-post individually rational mechanism that learns the qualities of the workers and guarantees the target accuracy in a cost-optimal way.
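The exploration-separated allocation idea described in the abstract can be sketched roughly as follows. This is a hypothetical, simplified illustration, not the authors' CCB-S implementation: worker qualities are estimated from observed label correctness, a Hoeffding-style confidence width decides when exploration ends, and exploitation greedily picks the cheapest subset whose pessimistic error bound meets the target accuracy. The class name `CCBSketch`, the width threshold, and the independent-error aggregation rule are all illustrative assumptions.

```python
import math

def confidence_width(n, t, delta=0.05):
    """Hoeffding-style confidence radius after n observations at round t."""
    return math.sqrt(math.log(2 * t / delta) / (2 * max(n, 1)))

class CCBSketch:
    """Hypothetical sketch of an exploration-separated,
    confidence-bound allocation (not the paper's actual CCB-S code)."""

    def __init__(self, costs, target_accuracy):
        self.costs = costs
        self.target = target_accuracy
        self.n = [0] * len(costs)     # observations per worker
        self.succ = [0] * len(costs)  # correct labels per worker
        self.t = 0

    def lcb(self, i):
        """Pessimistic (lower confidence bound) quality estimate."""
        qhat = self.succ[i] / self.n[i] if self.n[i] else 0.0
        return max(0.5, qhat - confidence_width(self.n[i], self.t + 1))

    def explore_done(self, width=0.1):
        """Exploration ends once every worker's quality is known
        to within the chosen width (an illustrative threshold)."""
        return all(confidence_width(self.n[i], self.t + 1) < width
                   for i in range(len(self.costs)))

    def select(self):
        self.t += 1
        # Exploration phase: assign the task to every worker.
        if not self.explore_done():
            return list(range(len(self.costs)))
        # Exploitation phase: greedily add cheapest workers until a
        # crude independent-error bound meets the target accuracy.
        order = sorted(range(len(self.costs)), key=lambda i: self.costs[i])
        chosen = []
        for i in order:
            chosen.append(i)
            err = 1.0
            for j in chosen:
                err *= (1.0 - self.lcb(j))  # all chosen workers wrong
            if 1.0 - err >= self.target:
                break
        return chosen

    def update(self, worker, correct):
        """Record whether the worker's label matched the ground truth."""
        self.n[worker] += 1
        self.succ[worker] += int(correct)
```

During exploration every worker is selected, so qualities can be learned; once the confidence intervals are tight enough, the cheapest sufficient subset is chosen. A real treatment would additionally need the payment rule that makes truthful cost reporting an ex-post equilibrium, which this sketch omits.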


Similar Articles

A quality assuring multi-armed bandit crowdsourcing mechanism with incentive compatible learning

We develop a novel multi-armed bandit (MAB) mechanism for the problem of selecting a subset of crowd workers to achieve an assured accuracy for each binary labelling task in a cost-optimal way. This problem is challenging because workers have unknown qualities and strategic costs.

A Truthful Budget Feasible Multi-Armed Bandit Mechanism for Crowdsourcing Time Critical Tasks

Motivated by allocation and pricing problems faced by service requesters on modern crowdsourcing platforms, we study a multi-armed bandit (MAB) problem with several real-world features: (a) the requester wishes to crowdsource a number of tasks but has a fixed budget, which leads to a trade-off between cost and quality while allocating tasks to workers; (b) each task has a fixed deadline and a wor…

A Multiarmed Bandit Incentive Mechanism for Crowdsourcing Demand Response in Smart Grids

Demand response is a critical part of renewable integration and energy cost reduction goals across the world. Motivated by the need to reduce costs arising from electricity shortage and renewable energy fluctuations, we propose a novel multiarmed bandit mechanism for demand response (MAB-MDR), which makes monetary offers to strategic consumers who have unknown response characteristics, to incenti…

A Dominant Strategy Truthful, Deterministic Multi-Armed Bandit Mechanism with Logarithmic Regret

Stochastic multi-armed bandit (MAB) mechanisms are widely used in sponsored search auctions, crowdsourcing, online procurement, etc. Existing stochastic MAB mechanisms with a deterministic payment rule, proposed in the literature, necessarily suffer a regret of Ω(T^{2/3}), where T is the number of time steps. This happens because the existing mechanisms consider the worst-case scenario where the mea…

An Optimal Bidimensional Multi-Armed Bandit Auction for Multi-unit Procurement

We study the problem of a buyer (aka auctioneer) who gains stochastic rewards by procuring multiple units of a service or item from a pool of heterogeneous strategic agents. The reward obtained for a single unit from an allocated agent depends on the inherent quality of the agent; the agent’s quality is fixed but unknown. Each agent can only supply a limited number of units (capacity of the age...


Journal:
  • CoRR

Volume: abs/1406.7157   Issue: –

Pages: –

Publication date: 2014